Coordinate Descent Methods for DC Minimization: Optimality Conditions and Global Convergence

Authors

Abstract

Difference-of-Convex (DC) minimization, referring to the problem of minimizing the difference of two convex functions, has found rich applications in statistical learning and has been studied extensively for decades. However, existing methods are primarily based on multi-stage convex relaxation, which only leads to the weak optimality of critical points. This paper proposes a coordinate descent method for minimizing a class of DC functions based on sequential nonconvex approximation. Our approach iteratively solves a nonconvex one-dimensional subproblem globally, and it is guaranteed to converge to a coordinate-wise stationary point. We prove that this new optimality condition is always stronger than the standard critical point condition and the directional point condition under a mild locally bounded nonconvexity assumption. For comparison, we also include a naive variant based on sequential convex approximation in our study. When the objective function satisfies a globally bounded nonconvexity assumption and the Luo-Tseng error bound assumption, the method achieves a Q-linear convergence rate. Also, for many applications of interest, we show that the one-dimensional subproblem can be computed exactly and efficiently using a breakpoint searching method. Finally, we have conducted extensive experiments on several tasks to demonstrate the superiority of our approach.
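To make the breakpoint-searching idea concrete, consider the capped-L1 regularized least-squares problem min_x 0.5*||Ax - b||^2 + lam * sum_i min(|x_i|, theta), a standard DC objective since min(|t|, theta) = |t| - max(|t| - theta, 0) is a difference of convex functions. Along a single coordinate the objective is piecewise quadratic with breakpoints at -theta, 0, and theta, so the one-dimensional subproblem can be solved globally by comparing the clipped vertex of each quadratic piece. The Python sketch below is our own illustration under these assumptions, not the authors' implementation; the problem choice, function names, and parameters are all hypothetical.

import numpy as np

def capped_l1(t, lam, theta):
    # DC penalty: lam*min(|t|, theta) = lam*|t| - lam*max(|t| - theta, 0)
    return lam * min(abs(t), theta)

def solve_1d_exact(a, c, lam, theta):
    # Global minimizer of phi(t) = 0.5*a*t^2 - c*t + lam*min(|t|, theta), a > 0.
    # Between the breakpoints {-theta, 0, theta} the objective is a convex
    # quadratic, so each piece is minimized at its vertex clipped to the piece;
    # the global minimum is the best of the (at most four) candidates.
    pieces = [                    # (lo, hi, linear slope added by the penalty)
        (-np.inf, -theta, 0.0),   # penalty constant (= lam*theta) on this piece
        (-theta, 0.0, -lam),      # penalty = -lam*t
        (0.0, theta, lam),        # penalty = +lam*t
        (theta, np.inf, 0.0),     # penalty constant (= lam*theta) on this piece
    ]
    best_t, best_val = 0.0, np.inf
    for lo, hi, slope in pieces:
        t = float(np.clip((c - slope) / a, lo, hi))   # clipped vertex
        val = 0.5 * a * t * t - c * t + capped_l1(t, lam, theta)
        if val < best_val:
            best_t, best_val = t, val
    return best_t

def cd_dc(A, b, lam=0.1, theta=1.0, sweeps=50):
    # Cyclic coordinate descent for 0.5*||Ax - b||^2 + sum_i lam*min(|x_i|, theta);
    # every coordinate subproblem is solved globally, not via a convex surrogate.
    m, n = A.shape
    x = np.zeros(n)
    r = b - A @ x                   # maintained residual
    col_sq = (A ** 2).sum(axis=0)   # a_i = ||A[:, i]||^2 (columns assumed nonzero)
    for _ in range(sweeps):
        for i in range(n):
            r += A[:, i] * x[i]     # remove coordinate i's contribution
            x[i] = solve_1d_exact(col_sq[i], A[:, i] @ r, lam, theta)
            r -= A[:, i] * x[i]     # restore residual with the new value
    return x

Maintaining the residual r keeps each coordinate update at O(m) cost, and solving the one-dimensional subproblem globally rather than through a convex majorizer is exactly what yields the stronger coordinate-wise stationarity discussed in the abstract.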


Similar Articles

Convergence of a Block Coordinate Descent Method for Nondifferentiable Minimization

We study the convergence properties of a (block) coordinate descent method applied to minimize a nondifferentiable (nonconvex) function f(x1, ..., xN) with certain separability and regularity properties. Assuming that f is continuous on a compact level set, the subsequence convergence of the iterates to a stationary point is shown when either f is pseudoconvex in every pair of coordinate ...


On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization

The cyclic block coordinate descent-type (CBCD-type) methods, which perform iterative updates for a few coordinates (a block) simultaneously throughout the procedure, have shown remarkable computational performance for solving strongly convex minimization problems. Typical applications include many popular statistical machine learning methods such as elastic-net regression, ridge penalized log...
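As a rough sketch of the CBCD update pattern described above (not code from the cited paper), the snippet below applies cyclic block coordinate descent to ridge regression, one of the typical applications mentioned; the block size, function name, and parameters are illustrative assumptions.

import numpy as np

def cbcd_ridge(A, b, lam=1.0, block_size=2, sweeps=100):
    # Cyclic block coordinate descent for min_x 0.5*||Ax - b||^2 + 0.5*lam*||x||^2.
    # Each inner step minimizes exactly over one block of coordinates while all
    # other coordinates stay fixed, cycling through the blocks in a fixed order.
    m, n = A.shape
    x = np.zeros(n)
    blocks = [np.arange(s, min(s + block_size, n)) for s in range(0, n, block_size)]
    for _ in range(sweeps):
        for blk in blocks:
            Ab = A[:, blk]
            r = b - A @ x + Ab @ x[blk]             # residual with this block removed
            H = Ab.T @ Ab + lam * np.eye(len(blk))  # block system; PD since lam > 0
            x[blk] = np.linalg.solve(H, Ab.T @ r)   # exact minimizer over the block
    return x

With block_size=1 this reduces to classical cyclic coordinate descent.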


An Improved Convergence Analysis of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization

The cyclic block coordinate descent-type (CBCD-type) methods have shown remarkable computational performance for solving strongly convex minimization problems. Typical applications include many popular statistical machine learning methods such as elastic-net regression, ridge penalized logistic regression, and sparse additive regression. Existing optimization literature has shown that the CBCD...


Smooth minimization of nonsmooth functions with parallel coordinate descent methods

We study the performance of a family of randomized parallel coordinate descent methods for minimizing the sum of a nonsmooth convex function and a separable convex function. The problem class includes as a special case L1-regularized L1 regression and the minimization of the exponential loss ("AdaBoost problem"). We assume the input data defining the loss function is contained in a sparse m × n matrix A with at ...
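As a simplified sketch of the randomized parallel update pattern (applied here to L1-regularized least squares rather than the smoothed nonsmooth losses treated in the paper), the snippet below samples a batch of coordinates per iteration and updates each with a soft-thresholding step; the ESO-style damping factor beta and all names and parameters are our own assumptions.

import numpy as np

def parallel_cd_lasso(A, b, lam=0.1, tau=4, iters=500, seed=0):
    # Randomized parallel coordinate descent for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    # Each iteration samples tau coordinates uniformly and updates them independently
    # (in parallel in principle), each via a soft-threshold step whose step size is
    # damped by beta to remain safe when several coordinates move at once.
    rng = np.random.default_rng(seed)
    m, n = A.shape
    L = (A ** 2).sum(axis=0)                 # per-coordinate Lipschitz constants
                                             # (columns assumed nonzero)
    omega = max(int((np.abs(A) > 0).sum(axis=1).max()), 1)  # max nonzeros per row
    beta = 1 + (omega - 1) * (tau - 1) / max(n - 1, 1)      # ESO-style damping
    x = np.zeros(n)
    r = A @ x - b                            # maintained residual Ax - b
    for _ in range(iters):
        S = rng.choice(n, size=tau, replace=False)
        g = A[:, S].T @ r                    # partial gradients at the old point
        step = np.zeros(tau)
        for k, i in enumerate(S):
            u = x[i] - g[k] / (beta * L[i])
            step[k] = np.sign(u) * max(abs(u) - lam / (beta * L[i]), 0.0) - x[i]
        x[S] += step
        r += A[:, S] @ step                  # one residual update for the batch
    return x

The sparser the data (smaller omega), the smaller beta, and the closer each parallel step is to the serial one.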


On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods

Cyclic coordinate descent is a classic optimization method that has witnessed a resurgence of interest in Signal Processing, Statistics and Machine Learning. Reasons for this renewed interest include the simplicity, speed, and stability of the method as well as its competitive performance on ℓ1-regularized smooth optimization problems. Surprisingly, very little is known about its non-asymptotic...



Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2023

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v37i9.26307